On the convergence of the Newton/log-barrier method

Author

  • Stephen J. Wright
Abstract

In the Newton/log-barrier method, Newton steps are taken for the log-barrier function for a fixed value of the barrier parameter until a certain convergence criterion is satisfied. The barrier parameter is then decreased and the Newton process is repeated. A naive analysis indicates that Newton's method does not exhibit superlinear convergence to the minimizer of each instance of the log-barrier function until it reaches a very small neighborhood of the minimizer. By partitioning according to the subspace of active constraint gradients, however, we show that this neighborhood is actually quite large, thus explaining why reasonably fast local convergence can be attained in practice. Moreover, we show that the overall convergence rate of the Newton/log-barrier algorithm is superlinear in the number of function/derivative evaluations, provided that the nonlinear program is formulated with a linear objective and that the schedule for decreasing the barrier parameter is related in a certain way to the convergence criterion for each Newton process.
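
As a rough illustration of the inner/outer structure described in the abstract, the sketch below shows a Newton/log-barrier loop for the linear-objective case min c^T x subject to A x >= b. All function and parameter names (newton_log_barrier, mu0, sigma, inner_tol_factor, mu_min) are illustrative assumptions rather than the paper's notation, and the simple backtracking used here only preserves strict feasibility; it omits the sufficient-decrease safeguards and the coordinated parameter schedule that the paper's analysis requires.

```python
import numpy as np

def newton_log_barrier(c, A, b, x0, mu0=1.0, sigma=0.1,
                       inner_tol_factor=0.5, mu_min=1e-8):
    """Toy Newton/log-barrier loop for  min c^T x  subject to  A x >= b.

    For a fixed barrier parameter mu, Newton steps are applied to the
    barrier function  B(x, mu) = c^T x - mu * sum_i log(a_i^T x - b_i)
    until the barrier gradient is small relative to mu; mu is then
    reduced by the factor sigma and the Newton process is restarted
    from the current iterate.  (Names and constants are illustrative.)
    """
    x, mu = np.asarray(x0, dtype=float), mu0
    while mu > mu_min:
        while True:
            s = A @ x - b                          # slacks, kept strictly positive
            grad = c - mu * (A.T @ (1.0 / s))      # gradient of B(., mu)
            if np.linalg.norm(grad) <= inner_tol_factor * mu:
                break                              # inner convergence criterion met
            hess = mu * (A.T * (1.0 / s**2)) @ A   # Hessian of B(., mu)
            step = np.linalg.solve(hess, -grad)
            t = 1.0
            while np.any(A @ (x + t * step) - b <= 0.0):
                t *= 0.5                           # backtrack to stay strictly feasible
            x = x + t * step
        mu *= sigma                                # decrease the barrier parameter
    return x

# Tiny usage example: min x1 + x2  s.t.  x1 >= 0, x2 >= 0, x1 + x2 >= 1.
c = np.array([1.0, 1.0])
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([0.0, 0.0, 1.0])
print(newton_log_barrier(c, A, b, x0=np.array([2.0, 2.0])))  # roughly [0.5, 0.5]
```

The interaction that matters in the abstract is between the inner stopping test (here, a gradient norm below inner_tol_factor * mu) and the rate at which mu is decreased (the factor sigma); the constants chosen in this sketch are arbitrary and are not the schedule analyzed in the paper.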


Related articles

Convergence of primal-dual solutions for the nonconvex log-barrier method without LICQ

This paper characterizes completely the behavior of the logarithmic barrier method under a standard second-order condition, strict (multivalued) complementarity and MFCQ at a local minimizer. We present direct proofs, based on certain key estimates and a few well-known facts on linear and parametric programming, in order to verify existence and Lipschitzian convergence of local primal-dual soluti...


Modifying the line search formula in the BFGS method to achieve global convergence.



On the convergence speed of artificial neural networks in the solving of linear systems

Artificial neural networks have advantages such as learning, adaptation, fault-tolerance, parallelism and generalization. This paper examines the application of diverse learning methods to the speed of convergence of neural networks. To this end, we first introduce a perceptron method based on artificial neural networks which has been applied to solving a non-singula...


An efficient improvement of the Newton method for solving nonconvex optimization problems

Newton's method is one of the best-known line search methods for minimizing functions. It is well known that the search direction and step length play important roles in this class of methods for solving optimization problems. In this investigation, a new modification of Newton's method for solving unconstrained optimization problems is presented. The significant ...


Global convergence of an inexact interior-point method for convex quadratic symmetric cone programming

In this paper, we propose a feasible interior-point method for convex quadratic programming over symmetric cones. The proposed algorithm relaxes the accuracy requirements in the solution of the Newton equation system by using an inexact Newton direction. Furthermore, we obtain an acceptable level of error in the inexact algorithm on convex quadratic symmetric cone programmin...



Journal:
  • Math. Program.

Volume 90, Issue -

Pages -

Publication year: 2001